
Using Data

SERVICEME Analytics is a self-service data analysis tool designed for non-technical users. Through natural language queries and intelligent chart generation, it lets users create, manage, and share data visualizations without any programming skills.

Objectives

  • Quickly generate data charts through natural language database queries
  • Provide diverse chart styles to meet various data analysis needs
  • Support user-level and enterprise-level chart libraries for personalized and shared management
  • Enhance data analysis capabilities for non-technical users, simplifying data querying and visualization processes

AI Report Generation Guide

  1. Access the Report Generation Page
    • Navigate to Analytics
    • Click the top-left AI Report Generation
    • Enter the Report Generation Page
  2. Select Data Source and Data
    • Choose a data source (e.g., "Sample Data Source")
    • Select specific data from the data source (e.g., Sales.OrderLines)
  3. Generate Intelligent Charts
    • Select Intelligent Chart Generation
    • Use recommended questions or input queries in natural language (e.g., "Analyze sales revenue for each product")
    • Click Query and wait for the chart to be generated
  4. Modify Chart Style
    • For example:
      • Change the chart type (e.g., Pie Chart)
      • Adjust the report colors (e.g., Blue)
  5. Publish the Report
    • Click the Publish button in the top-right corner
    • Choose the publication location:
      • My Analysis (accessible in the Dashboard)
      • Analysis Center (accessible in the Report Center)

AI Reports

AI Reports are report-generation tools built on the platform's data processing and intelligent analysis capabilities.

Advantages of AI Reports:

  • Efficient and Convenient: Quickly generate reports by inputting requirements without complex operations, reducing production cycles.
  • Intelligent Analysis: System-recommended questions and dynamic insights generated by large models provide data interpretation and analysis, lowering the barrier to data analysis.
  • High Flexibility: Supports both system recommendations and custom input needs, with style editing features for personalized display.
  • Collaboration and Sharing: Enables publishing and sharing for team collaboration and result reuse, enhancing data value transmission efficiency.
  • Data Insights: Using the "Insights" feature, reports in both the "Dashboard" and the "Report Center" can produce conclusions such as trend analysis, correlation analysis, variance analysis, and summaries.


Data Portal

Dashboard

The "Dashboard" is a personal space for managing reports. Once a report is published to "My Analysis," users can view it directly in the dashboard. Its core functions cover the entire lifecycle of report management, including editing content, modifying styles, downloading for retention, sharing with others, and deletion, fully meeting personalized report management needs.

Advantages of the Dashboard:

  • Strong Management Autonomy: Users have high autonomy over reports in the dashboard, from content editing to style adjustments, downloading, sharing, and deletion, catering to diverse management needs.
  • Centralized Data Asset Management: As a unified management space, the dashboard enables centralized storage of reports, allowing users to quickly locate and utilize reports published to "My Analysis," improving data asset usage efficiency.
  • Flexible Publishing Mechanism: Linked to the report publishing process, the dashboard lets users view reports published to "My Analysis" and integrates with the "Analysis Center" and other publishing targets, forming a seamless "Publish - Manage - Use" workflow.

Report Center

The Report Center is a public storage and exchange platform with the following main functions:

  • Report Storage and Management: Users can publish reports to the Analysis Center for quick retrieval later.
  • Report Bookmarking: Users can bookmark reports for easy access and reference.
  • Report Resource Sharing: Helps build a report resource library, facilitating learning and reference among users.

Data Source

In the process of AI report creation, data sources play a critical role. AI algorithms extract, clean, and integrate data from these sources to obtain comprehensive and accurate information.

💡 Tip: Typically, data sources are added by administrators in the data management section. Regular users cannot add data sources.


Data Governance

Data Catalog

The Data Catalog serves as the central hub for data management, providing a comprehensive and intuitive overview of data assets. It includes precise statistics on imported data, data interfaces, data subscriptions, and data queries, along with information on recent updates and queries.

The Data Catalog also supports searching data assets based on keywords, asset types, data sources, and other criteria.

The module allows users to view detailed metadata information, including table names, display names, field names, and comments. This metadata provides a comprehensive understanding of data characteristics and origins, supporting further analysis and application.

Users can interact with catalog data through the Copilot assistant using natural language. Simply input a question, and the system will automatically parse it, extract answers from the relevant data, and present them intuitively.

Metadata Management

Data Source Integration

The data source integration interface offers two methods for integrating data sources:

  • SQL Server: Integrate widely-used SQL Server databases by configuring server address, database name, username, and password. Once a stable connection is established, tables, views, and other objects can be accessed for data analysis.
  • Azure Databricks: Azure Databricks is a cloud-based analytics platform suitable for processing large-scale datasets. Using its API or connectors, data scattered across different storage locations (e.g., Azure Blob Storage, Data Lake) can be consolidated into the Databricks environment for transformation, analysis, and mining.
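
As a rough illustration, the four fields in the SQL Server integration form (server address, database name, username, password) map onto a standard ODBC-style connection string. The values below are placeholders, and the driver name depends on your environment:

```python
def sqlserver_connection_string(server, database, username, password):
    """Build an ODBC-style connection string from the four fields the
    integration form asks for. All values passed in are placeholders."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        f"Uid={username};Pwd={password};"
    )

# Hypothetical server, database, and service account
conn_str = sqlserver_connection_string(
    "analytics.example.com", "WideWorldImporters", "svc_reader", "********")
print(conn_str)
```

A driver such as pyodbc would consume a string like this to establish the connection; once it is stable, tables and views become available for analysis.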

SERVICEME supports tagging data sources, which aids in quickly filtering and locating data sources, playing a vital role in data governance and metadata management.

For integrated data sources, synchronization functionality ensures data timeliness and accuracy. When data sources are updated, manual synchronization can be triggered to reflect updates in SERVICEME.

💡 Tip: Regular users can also integrate data sources.

Data Collection

In the data collection module, importing table data is a core feature, allowing external table data to be easily introduced into the system for subsequent processing and analysis.

Before importing, a new template must be created in template management. During import, an appropriate template must be selected.

Template creation offers two methods: manual creation or file-based template creation.


Data Sharing

Data Interface

Data interface management is used to configure API interfaces, including interface names, categories, addresses, database entities, and enablement status. Administrators can manage various data interfaces here to ensure smooth data interaction with the system.

Viewing Data Interfaces

  1. Access the Data Interface Management Page: Navigate to "Data Interface" in system settings.
  2. View Configured Data Interfaces: The page displays all data interface information, including:
    • API Name: The name of the interface for identification.
    • Category: The category of the interface (e.g., case, SQL).
    • Interface Address: The access URL of the API interface.
    • Data Entity: The associated database entity, such as Website.Suppliers.
    • Pagination: Indicates whether the interface supports pagination.
    • Enablement Status: Shows whether the interface is enabled or disabled.
  3. Perform Operations: Administrators can view, edit, or delete each interface, test the interface by clicking "Test," or view interface call logs by clicking "Logs."

Adding a Data Interface

  1. Click "Add" Button: On the data interface management page, click the "Add" button to create a new data interface.
  2. Fill in Interface Information (partial list of fields):
    • API Name: Assign a unique name to the new interface.
    • Category: Select the category of the interface (e.g., case or SQL).
    • Interface Address: Enter the access URL of the interface.
    • Data Entity: Select or input the associated data entity.
    • Enablement Status: Set the enablement status of the interface.
  3. Click "Save": After filling in all necessary information, click "Save" to successfully create the new data interface.
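
Once enabled, a data interface is an ordinary HTTP endpoint. The sketch below only builds a paginated request URL; the address and the "page"/"pageSize" parameter names are hypothetical and should be taken from your actual interface configuration:

```python
from urllib.parse import urlencode

# Hypothetical interface address and pagination parameter names; the real
# URL and parameter scheme come from the interface's configuration.
base_url = "https://serviceme.example.com/api/data/suppliers"
params = {"page": 1, "pageSize": 50}

request_url = f"{base_url}?{urlencode(params)}"
print(request_url)
# A GET request to this URL would return the first 50 rows of the
# associated data entity, assuming the interface is enabled and paginated.
```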

Deleting a Data Interface

  1. Select the Interface to Delete: In the data interface list, select the interface to delete.
  2. Click "Delete" Button: Confirm the deletion to remove the data interface.

Interface Logs

Interface logs record detailed information about each data interface call, helping administrators troubleshoot and analyze interface usage. Each call generates a log entry containing request method, parameters, call time, duration, and status.

Details of Interface Logs:

  1. Access the Interface Logs Page: On the data interface management page, click "Interface Logs" to view all call records.
  2. View Log Details: The log table displays the following information:
    • API Name: The name of the called interface.
    • Request Method: The HTTP request method used, such as GET or POST.
    • Request Parameters: Parameters passed during the interface call.
    • Calling System: The system used for the call (if recorded).
    • Call Time: The specific time of the interface call.
    • Call Duration: The time taken for the request, e.g., 2.8174ms.
    • Call Status: The status of the call, such as success or failure.
  3. Operations: Each log entry has a "Details" button for viewing more detailed log information to help troubleshoot issues.
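
The log fields above lend themselves to simple client-side filtering once exported. A minimal sketch, with purely illustrative entries:

```python
# Each entry carries the fields listed above; the records are illustrative.
logs = [
    {"api": "GetSuppliers", "method": "GET", "duration_ms": 2.8174, "status": "success"},
    {"api": "RunSql", "method": "POST", "duration_ms": 5012.3, "status": "failure"},
]

# Surface the failed calls and any unusually slow ones for troubleshooting.
failed = [e for e in logs if e["status"] == "failure"]
slow = [e for e in logs if e["duration_ms"] > 1000]
print(failed)
```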

Querying Interface Logs

  1. Query by API Name: Input the API name to search for specific interface call logs.
  2. Query by Request Method: Filter log records by request method (e.g., GET or POST).
  3. Query by Call Time: Select a time range to view interface call logs within that period.

Handling Failed Interface Calls

  1. View Failure Status: For calls marked as "Failed," administrators can click "Details" to view the reason for failure, such as timeout or parameter errors.
  2. Troubleshoot Issues: Analyze log information, including request parameters, call duration, and status, to identify and resolve the cause of failure.
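
For transient failures such as timeouts, a common client-side pattern is to retry the call with exponential backoff before escalating. This is not a platform feature; "call_with_retry" below is a hypothetical helper:

```python
import time

def call_with_retry(call, attempts=3, base_delay=0.1):
    """Retry a zero-argument callable with exponential backoff.
    Re-raises the last error if every attempt fails."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            # wait base_delay, then 2x, 4x, ... before the next attempt
            time.sleep(base_delay * (2 ** attempt))

# Demo: a call that times out twice and then succeeds.
state = {"tries": 0}

def flaky_call():
    state["tries"] += 1
    if state["tries"] < 3:
        raise TimeoutError("simulated timeout")
    return {"status": "success"}

result = call_with_retry(flaky_call)
print(result)  # succeeds on the third attempt
```

Persistent failures (e.g., parameter errors) should instead be diagnosed from the log details, since retrying will not fix them.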

Data Subscription

Data subscription allows the SERVICEME platform to actively push data to external systems. Users can set subscription conditions and frequencies, and the system will check for matching data at the trigger time. If matching data exists, it will be pushed to the specified interface. Each subscription includes conditions, push methods, data entities, and other configurations.

Creating a Data Subscription

  1. Access the Data Subscription Page: Navigate to "Data Subscription" in system settings to view and manage subscriptions.
  2. Click "Create" Button: On the data subscription page, click "Create" to start a new subscription.
  3. Fill in Subscription Information:
    • Subscription Name: Assign a unique name to the subscription.
    • Data Source: Select the data source from existing options.
    • Data Entity: Choose the corresponding data entity, such as Sales.OrderLines.
  4. Configure Subscription Rules:
    • Data Fields: Select and configure fields for subscription.
    • Operators: Choose operators for filtering data.
    • Values: Specify filter values; only records matching these values will be pushed.
  5. Set Push Configuration:
    • Push Method: Select the method for pushing data (e.g., HTTP).
    • Push Frequency: Set the frequency using a standard five-field cron expression.
    • Batch Size: Specify the number of records per push.
    • API: Select the API for pushing data.
    • HTTP Method: Choose the HTTP request method, typically POST or GET.
  6. Set Request Headers and Body:
    • Headers: Configure HTTP headers, such as a fixed api_key for authentication.
    • Body: Define the message format for the receiving system. If not adjusted, all data will be sent as an array under the data attribute in JSON.
  7. Click "Save": After completing all configurations, click "Save" to create the subscription.
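
To make the push configuration concrete: at each trigger time (a five-field cron expression such as "0 9 * * *" would fire daily at 09:00), matching records are split by Batch Size, and with the default body each push carries one batch as an array under the "data" attribute. The sketch below uses illustrative entity and field names:

```python
import json

# Records that matched the subscription's filter at trigger time
# (field names are illustrative).
matching = [
    {"OrderLineID": 1, "Quantity": 10},
    {"OrderLineID": 2, "Quantity": 4},
    {"OrderLineID": 3, "Quantity": 7},
]

batch_size = 2  # the "Batch Size" value from the push configuration

# Split the matches into batches; each batch becomes one push.
batches = [matching[i:i + batch_size] for i in range(0, len(matching), batch_size)]

for batch in batches:
    # Default body: one batch as an array under the "data" attribute.
    payload = json.dumps({"data": batch})
    print(payload)  # this JSON would be sent to the configured API
```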

Deleting a Data Subscription

  1. Select the Subscription to Delete: In the subscription list, select the subscription to delete.
  2. Click "Delete" Button: Confirm the deletion to remove the subscription.

Push Logs

Push logs record the status of data subscription pushes, helping administrators track successes and failures and address issues promptly.

Viewing Push Logs:

  1. Access the Push Logs Page: On the data subscription page, click "Push Logs" to view all push records.
  2. View Push Information: Push logs display the following details:
    • Name: The name of the data subscription.
    • API: The API associated with the subscription.
    • Request Headers: The headers used in the request.
    • Request Parameters: Parameters passed during the push.
    • Request Time: The time of the push.
    • Push Status: The status of the push (e.g., success, failure).
    • Response Headers: The headers returned in the response.
  3. Query Push Logs: Administrators can search logs by API name, request time, and other criteria.
